Okay, so welcome to everybody on the recording.
This course will be recorded and uploaded to FAU.tv; the upload page for the course already exists, so you can find things there.
I would like to warn you that I'm doing this for people who can't be here, rather than for people who are just too lazy to get up on a Thursday morning at eight, or when it's too rainy.
This is, I hope, and has always been, a very interactive course, unlike a big frontal course like the AI lecture or something like that.
This is a course where you actually lose out by not being here.
So yes, we're recording it, but I see this more as an emergency measure than as something you want to make a habit of.
That's why we're running this smallish course the way we do: to be able to interact.
So this is a research-oriented course, which means that it's hard.
Just by sitting there, nodding, and saying, well, that looks good, you're not going to learn.
You will have to get your hands dirty.
And the best way is to get your hands and your minds dirty in class, because then you have easy access to me and to Frederick, who will be here for this course with me, to get answers and to see directly where you've misunderstood something.
And that's much easier in direct eye contact than via a little speaker.
All of that being said, we'll try to make it possible to attend this course in hybrid mode.
Okay.
So, this course talks about natural language, and it talks about natural language in a completely different way than is hyped at the moment.
We're not going to look at large language models, and we're not going to do any machine learning.
Instead, we're going to look at, essentially, how we can construct high-quality representations of language, and how we can build models of how the human mind might actually infer from world knowledge what is actually being said, even the things that are not explicitly represented.
And that's kind of the main thing we're trying to do here.
You may ask yourselves: but isn't this all made obsolete by ChatGPT or GPT-4 or other large language models?
And the answer is: we don't know yet.
Certainly, the current language models can do certain things very well: modeling parts of the world and interacting in language about them, for things where we have enough data.
So mostly English, German, French, and a couple of other languages; smaller languages like Inuit or Sorbian are much more of a problem.
We just don't have enough data there for a large language model, and tiny language models aren't so attractive.
So I do think that this course is not obsolete yet.
But I also think that thinking about language yourself, rather than letting Microsoft or OpenAI think about language for you, is fun, and something that's good for you.
So, that's why I still give this course.
We're going to focus on the meaning of natural language; natural languages are the languages that are out there: English, German, Chinese, Sanskrit, whatever.
One thing that I think you're going to see is that natural language is weird and wonderful.
So I'm going to try to focus on the phenomenon that, with natural language, we're using a very, very thin channel for transporting information.
These are the recordings of the LBS course of Winter Semester 2023/24.